Rényi Entropy and Rényi Divergence in the Intuitionistic Fuzzy Case

Authors
Abstract


Related articles

Rényi divergence and majorization

Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
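For reference (this standard definition is not quoted from the abstract above): for discrete distributions $P$ and $Q$ with full support and order $\alpha \in (0,1) \cup (1,\infty)$, the Rényi divergence is

$$D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \sum_i p_i^{\alpha} q_i^{1-\alpha},$$

with the orders $\alpha \in \{0, 1, \infty\}$ defined by taking limits.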


Rényi Divergence Variational Inference

This paper introduces the variational Rényi bound (VR) that extends traditional variational inference to Rényi’s α-divergences. This new family of variational methods unifies a number of existing approaches, and enables a smooth interpolation from the evidence lower-bound to the log (marginal) likelihood that is controlled by the value of α that parametrises the divergence. The reparameterizati...
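As a sketch of the bound referred to above (reproduced here for orientation; see the paper for precise conditions): for a model $p(x, z)$ with latent variable $z$ and an approximate posterior $q(z)$,

$$\mathcal{L}_\alpha(q; x) = \frac{1}{1-\alpha} \log \mathbb{E}_{q(z)}\!\left[\left(\frac{p(x, z)}{q(z)}\right)^{1-\alpha}\right],$$

which recovers the evidence lower bound as $\alpha \to 1$ and equals the log marginal likelihood $\log p(x)$ at $\alpha = 0$ when $q$ has full support.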


Smooth Entropy and Rényi Entropy

The notion of smooth entropy allows a unifying, generalized formulation of privacy amplification and entropy smoothing. Smooth entropy is a measure for the number of almost uniform random bits that can be extracted from a random source by probabilistic algorithms. It is known that the Rényi entropy of order at least 2 of a random variable is a lower bound for its smooth entropy. On the other ha...
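For context, the Rényi entropy of order $\alpha > 0$, $\alpha \neq 1$, of a discrete random variable $X$ with distribution $p$ is

$$H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_i p_i^{\alpha};$$

the order-2 case (collision entropy) gives the lower bound on smooth entropy mentioned above.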


Maximum Rényi Entropy Rate

Two maximization problems of Rényi entropy rate are investigated: the maximization over all stochastic processes whose marginals satisfy a linear constraint, and the Burg-like maximization over all stochastic processes whose autocovariance function begins with some given values. The solutions are related to the solutions to the analogous maximization problems of Shannon entropy rate.
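The quantity being maximized is, informally, the limit

$$H_\alpha(\{X_t\}) = \lim_{n \to \infty} \frac{1}{n} H_\alpha(X_1, \dots, X_n)$$

when it exists (stated here only for orientation; the paper's exact setup and constraints may differ).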


Rényi Divergence and Kullback-Leibler Divergence

Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
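Since the order-1 case is the key fact in this abstract, here is a minimal numerical sketch (function names are illustrative; full-support discrete distributions are assumed) showing $D_\alpha(P\|Q)$ approaching the Kullback-Leibler divergence as $\alpha \to 1$:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    # Rényi divergence of order alpha (alpha > 0, alpha != 1)
    # between two full-support discrete distributions.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    # Kullback-Leibler divergence, the order-1 limit of Rényi divergence.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
for alpha in (0.5, 0.9, 0.99, 0.999):
    print(f"D_{alpha}(P||Q) = {renyi_divergence(p, q, alpha):.6f}")
print(f"KL(P||Q)      = {kl_divergence(p, q):.6f}")
```

As alpha moves toward 1 the printed values converge to the Kullback-Leibler divergence, illustrating the limit stated in the abstract.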



Journal

Journal title: Tatra Mountains Mathematical Publications

Year: 2018

ISSN: 1210-3195

DOI: 10.2478/tmmp-2018-0023